6 research outputs found

    Consistency Modeling in a Multi-Model Architecture: Integrate and Celebrate Diversity

    Central to Model-Driven Engineering (MDE) is the view of models as objects that can be manipulated and organized into metamodel stacks and multi-model architectures. This work contributes a distinctive approach to consistency modeling in which the involved models are explicitly organized in a multi-model architecture: a general model for creating multi-model architectures that allows semantics to be attached is defined and applied, and the explicit attachment of semantics is demonstrated by attaching Java classes that implement different instantiation semantics, realizing both the consistency modeling and the automatic generation of consistency data. The kind of consistency addressed concerns relations between data residing in legacy databases defined by different schemas. The consistency modeling is meant to expose inconsistencies by relating the data, and it combines visual modeling and logic (OCL) in a practical way. The approach is not limited to exposing inconsistencies; it may also be used to derive more general information from one or more data sets. Consistency is modeled by defining a consistency model that relates elements of two given legacy models. The consistency model is expressed in a language specially designed for consistency modeling, which allows the definition of classes, associations, and invariants expressed in OCL. The interpretation of the language is special: given one conforming data set for each of the legacy models, the consistency model can be automatically instantiated to consistency data that tells whether the data sets are consistent. The invariants decide which instances to generate when building the consistency data. The amount of consistency data to create is finite and bounded by the given data sets: the consistency model is instantiated until no more elements can be added without breaking an invariant or a multiplicity. The consistency data is presented as a model that can be inspected by the user.
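
    As a rough illustration only, and not the paper's actual language, metamodel, or tooling, the Java sketch below shows the bounded-instantiation idea described above: candidate links between elements of two legacy data sets are generated and kept only when an invariant holds, so the resulting consistency data is finite and determined by the given data sets. The class names, the Link record, and the name-matching invariant are all invented for this example.

        import java.util.*;
        import java.util.function.BiPredicate;

        // Hypothetical sketch, not the paper's implementation: a consistency link
        // relates one element from each legacy data set, and an invariant decides
        // which links belong in the generated consistency data.
        public final class ConsistencySketch {

            public record Link<A, B>(A left, B right) {}

            // Instantiate the consistency model over two finite data sets: keep every
            // candidate link that satisfies the invariant. The result is finite because
            // it is bounded by the given data sets.
            public static <A, B> Set<Link<A, B>> instantiate(Collection<A> leftData,
                                                             Collection<B> rightData,
                                                             BiPredicate<A, B> invariant) {
                Set<Link<A, B>> consistencyData = new LinkedHashSet<>();
                for (A a : leftData)
                    for (B b : rightData)
                        if (invariant.test(a, b))           // stands in for an OCL invariant
                            consistencyData.add(new Link<>(a, b));
                return consistencyData;
            }

            public static void main(String[] args) {
                // Toy example: two "legacy" data sets holding person names in different formats.
                List<String> dbA = List.of("Ada Lovelace", "Alan Turing");
                List<String> dbB = List.of("LOVELACE, ADA", "HOPPER, GRACE");
                var links = instantiate(dbA, dbB,
                        (a, b) -> b.toLowerCase().contains(a.split(" ")[1].toLowerCase()));
                System.out.println(links);   // only the Lovelace pair satisfies the invariant
            }
        }

    In the paper the invariants are written in OCL and the instantiation semantics is attached through Java classes; here a plain BiPredicate stands in for both.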

    Supporting fine-grained generative model-driven evolution

    In the standard generative Model-Driven Architecture (MDA), adapting the models of an existing system requires regenerating and restarting that system, owing to a strong separation between the modeling environment and the runtime environment. Some current approaches remove this separation, allowing a system to be changed smoothly when its model changes; these approaches are, however, based on interpreting modeling information rather than on generation, as in MDA. This paper describes an architecture that supports fine-grained evolution combined with generative model-driven development: fine-grained changes are applied in a generative model-driven way to a system that has itself been developed in this way. To achieve this, model changes must be propagated correctly to the impacted elements. The impact of a model change flows along three dimensions: implementation, data (instances), and modeled dependencies. These three dimensions are explicitly represented in an integrated modeling-runtime environment to enable traceability, which implies a fundamental rethinking of MDA.
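
    As a loose sketch of the traceability idea, and not the paper's architecture, the Java code below keeps explicit links from a model element to the elements it impacts along the three dimensions named above (implementation, data instances, modeled dependencies) and computes the transitive impact of a change. All class, enum, and method names are invented for this example.

        import java.util.*;

        // Hypothetical sketch, not the paper's architecture: the impact of a model
        // change is propagated along three explicitly represented dimensions.
        public final class ChangePropagationSketch {

            enum Dimension { IMPLEMENTATION, DATA, MODELED_DEPENDENCY }

            record Element(String name) {}

            // Traceability links from a model element to the elements it impacts, per dimension.
            private final Map<Element, Map<Dimension, Set<Element>>> trace = new HashMap<>();

            void link(Element from, Dimension dim, Element to) {
                trace.computeIfAbsent(from, k -> new EnumMap<>(Dimension.class))
                     .computeIfAbsent(dim, k -> new LinkedHashSet<>())
                     .add(to);
            }

            // Collect everything transitively impacted by changing the given element.
            Set<Element> impactOf(Element changed) {
                Set<Element> impacted = new LinkedHashSet<>();
                Deque<Element> work = new ArrayDeque<>(List.of(changed));
                while (!work.isEmpty()) {
                    Element e = work.pop();
                    trace.getOrDefault(e, Map.of()).values().forEach(targets ->
                        targets.forEach(t -> { if (impacted.add(t)) work.push(t); }));
                }
                return impacted;
            }

            public static void main(String[] args) {
                var sketch = new ChangePropagationSketch();
                Element person = new Element("Person");           // model element
                Element table  = new Element("PERSON table");     // generated implementation
                Element rows   = new Element("Person instances"); // runtime data
                sketch.link(person, Dimension.IMPLEMENTATION, table);
                sketch.link(person, Dimension.DATA, rows);
                System.out.println(sketch.impactOf(person));      // both impacted elements
            }
        }

    In the integrated modeling-runtime environment described above, such links would be maintained between the model, the generated implementation, and the running instances rather than built by hand as in this toy example.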

    Modeling and Testing Legacy Data Consistency Requirements

    An increasing number of data sources are available on the Internet, many of which offer semantically overlapping data based on different schemas, or models. While it is often of interest to integrate such data sources, the lack of consistency among them makes this integration difficult. This paper addresses the need for new techniques that enable the modeling and consistency checking of legacy data sources. Specifically, the paper contributes to the development of a framework that enables consistency testing of data coming from different types of data sources; the vehicle is UML and its accompanying XMI. The paper presents techniques for modeling consistency requirements using OCL and other UML modeling elements: it studies how models that describe the required consistencies among instances of legacy models can be designed in standard UML tools that support XMI. The paper also considers the automatic checking of consistency in the context of one of the modeling techniques; the legacy model instances that are inputs to the consistency check must be represented in XMI.
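
    Since the legacy model instances are required to be in XMI, a consistency test can in principle be run against the exported documents directly. The Java sketch below is only an assumed illustration of that idea, parsing two XMI files with the standard DOM API and checking one made-up requirement; the file names, tag and attribute names, and the rule itself are not taken from the paper.

        import javax.xml.parsers.DocumentBuilderFactory;
        import org.w3c.dom.Document;
        import org.w3c.dom.NodeList;
        import java.io.File;
        import java.util.*;

        // Hypothetical sketch: both legacy model instances arrive as XMI, so a simple
        // consistency test can parse the two documents and compare extracted values.
        public final class XmiConsistencyTest {

            // Collect the values of one attribute from all elements with the given tag name.
            static List<String> attributeValues(File xmi, String tag, String attr) throws Exception {
                Document doc = DocumentBuilderFactory.newInstance()
                                                     .newDocumentBuilder()
                                                     .parse(xmi);
                NodeList nodes = doc.getElementsByTagName(tag);
                List<String> values = new ArrayList<>();
                for (int i = 0; i < nodes.getLength(); i++)
                    values.add(((org.w3c.dom.Element) nodes.item(i)).getAttribute(attr));
                return values;
            }

            public static void main(String[] args) throws Exception {
                // Assumed requirement: every customer id in the first export must also occur
                // in the second one; this stands in for an OCL consistency requirement.
                List<String> left  = attributeValues(new File("customersA.xmi"), "Customer", "id");
                Set<String>  right = new HashSet<>(attributeValues(new File("customersB.xmi"), "Customer", "id"));
                left.stream()
                    .filter(id -> !right.contains(id))
                    .forEach(id -> System.out.println("Inconsistent: id " + id + " missing in second data set"));
            }
        }

    The sketch treats XMI as plain XML; the paper's techniques instead work at the level of UML models and OCL constraints designed in standard UML tools.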
